The following content has been provided by the University of Erlangen-Nürnberg.
So good morning. You are the lucky ones, because we decided to turn off the lights, which will result in a worse recording of the lecture, but you will see the slides much more clearly.
My name is Katharina Branninger. As you might have already guessed, I'm not Andreas Meyer.
But I'll give the talk today, and next week Vincent Kristlein will give the lecture.
After that you will have the pleasure of actually listening to Andreas Meyer.
At the moment it's a little past 8:20, but I think all subsequent lectures will start around 8:30,
because it's a little more convenient for us and probably also for you.
So we will start a little later, and the lecture will then usually run until 10.
So I think I will wait maybe another two or three minutes to make sure that we don't have too many late joiners.
If you have any immediate questions at the moment, feel free to ask. Are there any questions?
Yeah? It is not absolutely necessary to have taken Pattern Recognition before this course.
It's not a hard requirement, but you will find that we build on a lot of prerequisites in this course.
So we can't recap everything that has been done in pattern recognition.
Of course you also don't need everything, but you do need quite a bit of basic knowledge that we cover in this lecture.
You can also simply recap that on your own or go back to these lectures.
We have slides and recordings for all of them.
So if you find yourself struggling with one of the concepts that we present, you might want to go back there and just check or read up.
There are also many excellent sources on the internet that you can use to make sure you have the knowledge we require here.
Does that answer your question? Good. Any other questions?
So this room is rather large. You are very welcome to ask any questions throughout the lecture.
Simply interrupt me by raising your hand.
This is a big room, so if I don't see you right away, just keep waving until I do.
So maybe just to frame the setting of this lecture: it is an advanced class, so we can't cover all the basic concepts; we require you to have some prior knowledge of them.
If you have heard Pattern Recognition, you should already have that prior knowledge, but as I've said, you can also catch up on your own or follow any online lecture on pattern recognition or machine learning.
That should give you an idea of what we are going to talk about.
We also offer the lectures Introduction to Pattern Recognition, Pattern Recognition, and Pattern Analysis at our lab.
It really makes sense to take these before this lecture, but you're all master students, I think, or most of you should be master students.
I think you know when to look at something that you haven't heard before.
So let me start. A big welcome again to the lecture, Deep Learning, in the summer semester.
I would like to start with introducing the team that is going to get you through this lecture.
There is, of course, Andreas Meyer, who you will see in two weeks.
Then Suleiman and me. Suleiman is sitting here in the front.
We are going to take over the organizational part with respect to the exercises and the lecture in general.
So if you have any questions regarding the organization or if you have issues with registering, etc., please contact the two of us.
Then we have a number of student tutors who you will see in the exercises throughout the semester.
So there's Florian, Leonid, Noah, Luca, and Zahra.
Each of the five will supervise two exercise groups.
Then we have a number of our colleagues from the lab: Mathis, Christian, Felix, and Uelin.
Christian is sitting here in the front, and I forgot that Noah is also sitting here in the front.
Felix and Christian are also here today if you have questions.
If you need more support, they will also come to the exercises to help you.
Maybe just a note right away: the exercises are going to be demanding.
They will require you to do programming, but I will go into more detail on that later in the lecture.
Good. So let me start with kind of the buzzwords about deep learning.
Probably most of you are here because deep learning has been quite present in the media for the last few years,
especially in the kind of technical environment that we're in,
and a lot of buzzwords have been thrown around in this context.
So let me maybe start with the tasks.
Tasks in deep learning classically relate to classification on the one hand, where we have a sample and want to associate it with a certain class.
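To make the classification setting concrete, here is a minimal sketch: each sample is a feature vector, and the classifier assigns it one of a fixed set of class labels. The nearest-centroid classifier below is a hypothetical stand-in for the neural networks covered later in the lecture; it only illustrates the sample-to-class mapping.

```python
import numpy as np

def fit_centroids(X, y):
    """Compute one mean feature vector (centroid) per class label."""
    labels = np.unique(y)
    centroids = np.stack([X[y == c].mean(axis=0) for c in labels])
    return labels, centroids

def classify(x, labels, centroids):
    """Associate a sample with the class of the closest centroid."""
    distances = np.linalg.norm(centroids - x, axis=1)
    return labels[np.argmin(distances)]

# Two toy classes in a 2-D feature space.
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])

labels, centroids = fit_centroids(X, y)
print(classify(np.array([0.1, 0.0]), labels, centroids))  # -> 0 (close to class 0)
```

A deep network replaces the hand-picked centroids with learned features and decision boundaries, but the task, mapping a sample to a class, stays the same.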
Accessible via: open access
Duration: 01:36:54 min
Recording date: 2019-04-25
Uploaded: 2019-04-27 08:39:00
Language: en-US
Deep Learning (DL) has attracted much interest in a wide range of applications such as image recognition, speech recognition and artificial intelligence, both from academia and industry. This lecture introduces the core elements of neural networks and deep learning, it comprises:
- (multilayer) perceptron, backpropagation, fully connected neural networks
- loss functions and optimization strategies
- convolutional neural networks (CNNs)
- activation functions
- regularization strategies
- common practices for training and evaluating neural networks
- visualization of networks and results
- common architectures, such as LeNet, AlexNet, VGG, GoogLeNet
- recurrent neural networks (RNN, TBPTT, LSTM, GRU)
- deep reinforcement learning
- unsupervised learning (autoencoder, RBM, DBM, VAE)
- generative adversarial networks (GANs)
- weakly supervised learning
- applications of deep learning (segmentation, object detection, speech recognition, ...)
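The first topic above, the perceptron, can be sketched in a few lines. This is a hedged toy example using the classic perceptron learning rule on a linearly separable problem; the multilayer case with backpropagation, as covered in the lecture, generalizes this single-unit model.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified -> perceptron update
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Linearly separable toy data: the AND problem with labels in {-1, +1}.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])

w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)
print(preds)  # matches y on this separable toy data
```

On separable data like this, the perceptron rule is guaranteed to converge; non-separable problems such as XOR are what motivate the multilayer networks in the lecture.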